4 minute read | May 21, 2025
On May 19, President Donald Trump signed the bipartisan TAKE IT DOWN Act (S. 146) into law.[1] While almost forty U.S. states have enacted some form of legislation targeting online abuse, the TAKE IT DOWN Act is the first significant federal legislation to provide a baseline level of protection against the spread of non-consensual intimate imagery (NCII), including AI-generated deepfake pornography and so-called “revenge porn.”[2]
Digital platforms have one year, until May 19, 2026, to establish the required notice-and-removal processes.
The TAKE IT DOWN Act criminalizes the publication of NCII as well as realistic, computer-generated intimate images depicting identifiable individuals without their consent. It also mandates that social media platforms, online forums, hosting services and other tech companies that facilitate user-generated content remove such content within 48 hours of a valid request from the affected individual. The Federal Trade Commission is empowered to enforce these requirements and hold online platforms accountable, treating non-compliance as a deceptive or unfair practice under federal consumer protection law.
The Act introduces significant new legal obligations for online platforms, particularly those that host or distribute user-generated content. These changes will require most platforms to reassess their content moderation workflows, takedown procedures and liability exposure.
Under the new rules, platforms must:

- Establish a notice-and-removal process through which identifiable individuals (or those authorized to act on their behalf) can request removal of NCII.
- Remove reported content within 48 hours of receiving a valid removal request (one rough way to track that deadline is sketched below).
- Make reasonable efforts to identify and remove any known identical copies of the reported depiction.
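The statute fixes the 48-hour window but prescribes no technical implementation. Purely as an illustration, a compliance team might track the statutory clock along the following lines; the `RemovalRequest` record and its field names are hypothetical, not drawn from the Act.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

# The Act's removal clock: 48 hours from receipt of a valid request.
REMOVAL_WINDOW = timedelta(hours=48)

@dataclass
class RemovalRequest:
    """Hypothetical record of a validated NCII removal request."""
    content_id: str
    received_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

    @property
    def deadline(self) -> datetime:
        # The statutory deadline runs from receipt of the valid request.
        return self.received_at + REMOVAL_WINDOW

    def is_overdue(self, now: datetime | None = None) -> bool:
        """True once the 48-hour window has elapsed."""
        return (now or datetime.now(timezone.utc)) > self.deadline
```

Logging requests with UTC timestamps, as above, avoids time-zone ambiguity when demonstrating to a regulator that the 48-hour window was met.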
The Act does not explicitly require platforms to proactively monitor or filter all content; however, it does require “reasonable efforts to identify and remove any known identical copies” of the intimate visual depiction. Because it is unclear how “reasonable efforts” will be interpreted, platforms are well advised to consider duplicate-detection tools such as image hashing (see the sketch below). Additionally, although the Act does not create a continuing duty to police future uploads beyond removing the reported image and any identical copies specifically identified in valid reports, broader moderation practices may help mitigate legal and reputational risk with enforcement authorities and users.
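For context, duplicate detection is commonly built on hashing: a cryptographic hash catches byte-identical re-uploads, while a perceptual hash also catches re-encoded or resized copies. Below is a minimal sketch of that approach, assuming the third-party Pillow and ImageHash Python packages; the distance threshold is an illustrative assumption, not a legal standard.

```python
import hashlib
from pathlib import Path

from PIL import Image  # pip install Pillow
import imagehash       # pip install ImageHash

def exact_fingerprint(path: Path) -> str:
    """SHA-256 of the raw bytes: matches byte-identical copies only."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def perceptual_fingerprint(path: Path) -> imagehash.ImageHash:
    """Perceptual hash: survives re-encoding, resizing, minor edits."""
    return imagehash.phash(Image.open(path))

def is_known_copy(candidate: Path, reported: Path, threshold: int = 5) -> bool:
    # Check for a byte-identical copy first, then fall back to a
    # Hamming-distance comparison of perceptual hashes to catch
    # re-encoded or lightly altered versions.
    if exact_fingerprint(candidate) == exact_fingerprint(reported):
        return True
    distance = perceptual_fingerprint(candidate) - perceptual_fingerprint(reported)
    return distance <= threshold
```

How such tooling maps onto the statute's “reasonable efforts” standard will ultimately be a question for the FTC and the courts, not a matter of any particular threshold.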
Online platforms should begin assessing their readiness now in anticipation of the May 19, 2026 compliance deadline to reduce risk exposure, strengthen compliance, and signal a proactive commitment to user safety. Online platforms that host images, videos, or social content should:

- Stand up, or update, a notice-and-removal process that satisfies the Act's requirements before the deadline.
- Build and test takedown workflows capable of removing reported NCII within 48 hours of a valid request.
- Evaluate duplicate-detection and image-hashing tools to support “reasonable efforts” to remove known identical copies.
- Reassess content moderation workflows, escalation paths, and liability exposure for user-generated content.
If you have questions about the TAKE IT DOWN Act, reach out to the authors Aravind Swaminathan, Jake Heath, Meg Hennessey, David Curtis, Ryann McMurry, or Tom Zick.
[1] Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act
[2] Digital forgeries—the Act’s terminology for AI-generated deepfakes—have made it possible to fabricate pornographic images of individuals without their consent, even when no such original content exists.